
    Unveiling the Biometric Potential of Finger-Based ECG Signals

    The ECG signal has been shown to contain relevant information for human identification. Even though results validate the potential of these signals, the data acquisition methods and apparatus explored so far compromise user acceptability, since they require acquiring the ECG at the chest. In this paper, we propose a finger-based ECG biometric system that uses signals collected at the fingers through a minimally intrusive 1-lead ECG setup, with dry Ag/AgCl electrodes (no gel) as the interface with the skin. The collected signal is significantly noisier than ECG acquired at the chest, motivating the application of feature extraction and signal processing techniques to the problem. Time-domain ECG signal processing is performed, comprising the usual steps of filtering, peak detection, heartbeat waveform segmentation, and amplitude normalization, plus an additional step of time normalization. Using a simple minimum-distance criterion between the test patterns and the enrollment database, results reveal this to be a promising technique for biometric applications.
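    A minimal sketch of the time-domain pipeline described above: band-pass filtering, R-peak detection, heartbeat segmentation, amplitude and time normalization, and a minimum-distance match against enrolled templates. Cutoff frequencies, window lengths, and the enrollment layout are illustrative assumptions, not the authors' exact parameters.

    import numpy as np
    from scipy.signal import butter, filtfilt, find_peaks, resample

    def preprocess_heartbeats(ecg, fs, beat_len=200):
        """Filter a raw 1-lead ECG and return normalized heartbeat templates."""
        # Band-pass filter to suppress baseline wander and high-frequency noise
        # (assumed 1-40 Hz passband, 4th-order Butterworth).
        b, a = butter(4, [1.0 / (fs / 2), 40.0 / (fs / 2)], btype="band")
        filtered = filtfilt(b, a, ecg)

        # R-peak detection (assumed minimum spacing of 0.5 s between beats).
        peaks, _ = find_peaks(filtered, distance=int(0.5 * fs), height=np.std(filtered))

        beats = []
        half = int(0.3 * fs)  # segment 300 ms on each side of the R-peak (assumption)
        for p in peaks:
            if p - half < 0 or p + half >= len(filtered):
                continue
            seg = filtered[p - half:p + half]
            seg = (seg - seg.mean()) / (np.abs(seg).max() + 1e-12)  # amplitude normalization
            beats.append(resample(seg, beat_len))                   # time normalization
        return np.array(beats)

    def min_distance_identify(test_beats, enrollment):
        """Assign the enrolled subject whose mean template is closest to the test mean."""
        query = test_beats.mean(axis=0)
        distances = {subject: np.linalg.norm(query - template)
                     for subject, template in enrollment.items()}
        return min(distances, key=distances.get)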

    NETCU: analysing e-Learning networked curricula in Europe: the importance of legal and quality assurance aspects

    Conference held in Porto, 6-9 June 2012.

    ECG-based biometrics: A real time classification approach

    Behavioral biometrics is one of the areas with growing interest within the biosignal research community. A recent trend in the field is ECG-based biometrics, where electrocardiographic (ECG) signals are used as input to the biometric system. Previous work has shown this to be a promising trait, with the potential to serve as a good complement to other existing, already more established modalities, due to its intrinsic characteristics. In this paper, we propose a system for ECG biometrics centered on signals acquired at the subject's hand. Our work builds on a previously developed custom, non-intrusive sensing apparatus for data acquisition at the hands, and involves pre-processing of the ECG signals and the evaluation of two classification approaches targeted at real-time or near real-time applications. Preliminary results show that this system yields competitive results for both authentication and identification, further validating the potential of ECG signals as a complementary modality in the toolbox of the biometric system designer.
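    A hedged sketch of a near real-time decision loop of the kind targeted above: beats arrive one at a time (already segmented and normalized, e.g. by a pipeline like the one sketched earlier), each beat is scored against the enrolled templates, and a rolling majority vote smooths the identification while a distance threshold gates authentication. The scoring rule, vote window, and acceptance threshold are illustrative assumptions, not the classifiers evaluated in the paper.

    from collections import Counter, deque
    import numpy as np

    def stream_decisions(beat_stream, enrollment, vote_window=5, accept_dist=2.0):
        """Yield (identified_subject, accepted) for each incoming heartbeat."""
        recent = deque(maxlen=vote_window)
        for beat in beat_stream:
            distances = {subject: np.linalg.norm(beat - template)
                         for subject, template in enrollment.items()}
            best = min(distances, key=distances.get)
            recent.append(best)
            # Identification: majority vote over the last few beats.
            identified, _ = Counter(recent).most_common(1)[0]
            # Authentication-style check: accept only if the best match is
            # close enough (hypothetical operating threshold).
            accepted = distances[identified] <= accept_dist
            yield identified, accepted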

    Mid2p stabilizes septin rings during cytokinesis in fission yeast

    Septins are filament-forming proteins with a conserved role in cytokinesis. In the fission yeast Schizosaccharomyces pombe, septin rings appear to be involved primarily in cell–cell separation, a late stage in cytokinesis. Here, we identified a protein, Mid2p, on the basis of its sequence similarity to S. pombe Mid1p, Saccharomyces cerevisiae Bud4p, and Candida albicans Int1p. Like septin mutants, mid2Δ mutants had delays in cell–cell separation. mid2Δ mutants were defective in septin organization but not in contractile ring closure or septum formation. In wild-type cells, septins assembled first during mitosis in a single ring and during septation developed into double rings that did not contract. In mid2Δ cells, septins initially assembled in a single ring but during septation appeared in the cleavage furrow, forming a washer or disc structure. FRAP studies showed that septins are stable in wild-type cells but exchange 30-fold more rapidly in mid2Δ cells. Mid2p colocalized with septins and required septins for its localization. A COOH-terminal pleckstrin homology domain of Mid2p was required for its localization and function. No genetic interactions were found between mid2 and the related gene mid1. Thus, these studies identify a new factor responsible for the proper stability and function of septins during cytokinesis.

    Human-assisted vs. deep learning feature extraction: an evaluation of ECG feature extraction methods for arrhythmia classification using machine learning

    The success of arrhythmia classification tasks with Machine Learning (ML) algorithms has relied on the handcrafted extraction of features from Electrocardiography (ECG) signals. However, feature extraction is a time-consuming trial-and-error approach. Deep Neural Network (DNN) algorithms bypass handcrafted feature extraction, since they extract features automatically in their hidden layers; however, it is important to have access to a balanced dataset for algorithm training. In this exploratory research study, we compare the evaluation metrics of one-dimensional Convolutional Neural Networks (1D-CNN) and Support Vector Machines (SVM) using a dataset created by merging the public TNMG and CINC17 ECG databases. Results: Both algorithms showed good performance on the merged ECG database. The 1D-CNN algorithm achieved a precision of 93.04%, an accuracy of 93.07%, a recall of 93.20%, and an F1-score of 93.05%. The SVM classifier (λ = 10, C = 10 × 10⁹) achieved its best classification metrics with two combined handcrafted feature extraction methods, wavelet transforms and R-peak interval features, reaching an overall precision of 89.04%, accuracy of 92.00%, recall of 94.20%, and F1-score of 91.54%. With wavelet transforms as the only input feature and an SVM (λ = 10, C = 100), the precision, accuracy, recall, and F1-score were 86.15%, 85.33%, 81.16%, and 83.58%, respectively. Conclusion: Researchers face a challenge in finding sufficiently broad datasets to evaluate ML models. One way to address this, especially for deep learning models, is to combine several public datasets to increase the amount of data. The SVM and 1D-CNN algorithms both showed positive results on the merged database, with similar F1-score, precision, and recall for arrhythmia classification. Despite the favorable results for both, it should be considered that for the SVM, feature selection is a time-consuming trial-and-error process, whereas CNN algorithms can reduce that workload significantly. The disadvantage of CNN algorithms is their higher computational processing cost; in the absence of powerful computational resources, the SVM can be a reliable solution. Funding: FCT – Fundação para a Ciência e Tecnologia, within the R&D Units Project Scope UIDB/00319/2020.
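    A hedged sketch contrasting the two families compared above: an SVM on handcrafted wavelet plus R-R interval features versus a small 1D-CNN that learns features directly from the beat waveform. The wavelet family, layer sizes, and all hyperparameters are illustrative assumptions, not the study's configuration.

    import numpy as np
    import pywt
    import torch
    import torch.nn as nn
    from sklearn.svm import SVC

    def handcrafted_features(beat, rr_interval):
        """Wavelet decomposition coefficients plus the R-R interval as one vector."""
        coeffs = pywt.wavedec(beat, "db4", level=3)   # assumed wavelet family/level
        return np.concatenate([*coeffs, [rr_interval]])

    def make_svm():
        # Gaussian-kernel SVM; C and gamma here are placeholders, not the paper's values.
        return SVC(kernel="rbf", C=10.0, gamma="scale")

    class Small1DCNN(nn.Module):
        """Minimal 1D-CNN for beat classification (illustrative architecture)."""
        def __init__(self, beat_len=200, n_classes=4):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv1d(1, 16, kernel_size=7, padding=3), nn.ReLU(), nn.MaxPool1d(2),
                nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool1d(2),
            )
            self.classifier = nn.Linear(32 * (beat_len // 4), n_classes)

        def forward(self, x):  # x: (batch, 1, beat_len)
            h = self.features(x)
            return self.classifier(h.flatten(1))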

    THE INNOVATION OF THE FAIR TRADE MOVEMENT TO FOSTER SUSTAINABILITY AIMS

    Much of society's awareness of the sustainable development goals was fostered by United Nations (UN) programmes, non-governmental organizations, and the social movements they inspired. Within the flow of social change that followed the Second World War, the fair trade initiative innovated as a social movement by offering a model of international trade intended to make a difference in producers' lives. The main fair trade organizations brought social responsibility values to bear by addressing goals such as poverty alleviation; the reduction of North-South market inequalities; environmental protection; fair working conditions; the promotion of responsible consumption and production; and food security. By meeting these objectives, the fair trade movement can be aligned with the Sustainable Development Goals (SDGs) established by the UN in 2015 and with the three dimensions of sustainability. These synergies can be demonstrated in the social responsibility and sustainability reports of fair trade organizations. The materials and methods of this article included a review of the corporate social responsibility and sustainability reports of the main fair trade organizations from 2000 to the present. The results show consistency between the terms common to the reported fair trade objectives and the SDGs. A comparative analysis indicates the spectrum of sustainability topics progressively addressed by the fair trade movement since at least the year 2000. This review may help guide government policies and socially oriented companies in promoting sustainability goals through innovations in food systems, contributing to sustainable agriculture and rural development.

    ECG biometrics using deep learning and relative score threshold classification

    The field of biometrics is a pattern recognition problem, where individual traits are coded, registered, and compared with other database records. Because Electrocardiograms (ECG) are difficult to reproduce, their usage has been emerging in the biometric field for more secure applications. Inspired by the high performance shown by Deep Neural Networks (DNN), and to mitigate the intra-subject variability of each individual's ECG, this work proposes two architectures to improve current results in both identification (finding the registered person from a sample) and authentication (proving that a person is who they claim to be): a Temporal Convolutional Neural Network (TCNN) and a Recurrent Neural Network (RNN). Each architecture produces a similarity score, based on the prediction error of the former and the logits of the latter, which is fed to the same classifier, the Relative Score Threshold Classifier (RSTC). The architectures were trained and tested on public databases used in the literature for this context: the Fantasia, MIT-BIH, and CYBHi databases. Results show that the TCNN overall outperforms the RNN, achieving almost 100%, 96%, and 90% accuracy, respectively, for identification, and 0.0%, 0.1%, and 2.2% equal error rate (EER) for authentication. Compared with previous work, both architectures reached results beyond the state of the art. Nevertheless, refinements of these techniques, such as enriching training with extra varied data and transfer learning, may provide more robust systems with a reduced time required for validation. (Grant PD/BDE/130216/2017)
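    A hedged sketch of the kind of temporal convolutional building block a TCNN rests on: dilated causal 1D convolutions with a residual connection, so each output depends only on past samples. Channel counts, kernel size, and the dilation schedule are illustrative assumptions, not the paper's architecture; the per-subject scores such a model produces would then be fed to a threshold-based classifier like the RSTC described above.

    import torch
    import torch.nn as nn

    class CausalConv1d(nn.Module):
        """1D convolution padded on the left so outputs never see future samples."""
        def __init__(self, in_ch, out_ch, kernel_size, dilation):
            super().__init__()
            self.pad = (kernel_size - 1) * dilation
            self.conv = nn.Conv1d(in_ch, out_ch, kernel_size, dilation=dilation)

        def forward(self, x):
            return self.conv(nn.functional.pad(x, (self.pad, 0)))

    class TemporalBlock(nn.Module):
        """Two causal convolutions with a residual connection (TCN building block)."""
        def __init__(self, in_ch, out_ch, kernel_size=3, dilation=1):
            super().__init__()
            self.net = nn.Sequential(
                CausalConv1d(in_ch, out_ch, kernel_size, dilation), nn.ReLU(),
                CausalConv1d(out_ch, out_ch, kernel_size, dilation), nn.ReLU(),
            )
            self.downsample = nn.Conv1d(in_ch, out_ch, 1) if in_ch != out_ch else nn.Identity()

        def forward(self, x):  # x: (batch, channels, time)
            return torch.relu(self.net(x) + self.downsample(x))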

    Definition of MV Load Diagrams via Weighted Evidence Accumulation Clustering using Subsampling

    Medium voltage (MV) load diagrams were defined based on a knowledge discovery in databases process. Clustering techniques were used to support agents in the electric power retail markets in obtaining specific knowledge of their customers' consumption habits. Each customer class resulting from the clustering operation is represented by its load diagram. The Two-Step clustering algorithm and the WEACS approach, based on evidence accumulation clustering (EAC), were applied to electricity consumption data from a utility client database in order to form the customer classes and to find a set of representative consumption patterns. The WEACS approach is a clustering ensemble combination approach that uses subsampling and weights the partitions differently in the co-association matrix. As a complementary step to the WEACS approach, all the final data partitions produced by the different variations of the method are combined, and the Ward-link algorithm is used to obtain the final data partition. Experimental results showed that the WEACS approach achieved better accuracy than many other clustering approaches. In this paper, the WEACS approach separates the customer population better than the Two-Step clustering algorithm.
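    A hedged sketch of evidence accumulation clustering with subsampling: many k-means partitions on random subsamples vote into a co-association matrix, which is then cut with Ward-linkage hierarchical clustering. Uniform partition weights and all parameter values here are illustrative assumptions; the WEACS variant described above weights each partition's contribution non-uniformly before accumulation.

    import numpy as np
    from scipy.cluster.hierarchy import fcluster, linkage
    from scipy.spatial.distance import squareform
    from sklearn.cluster import KMeans

    def eac_subsampling(X, n_partitions=50, subsample=0.8, k_range=(5, 20),
                        n_final_clusters=8, seed=None):
        """Return final cluster labels from a subsampled k-means ensemble."""
        rng = np.random.default_rng(seed)
        n = len(X)
        coassoc = np.zeros((n, n))
        counts = np.zeros((n, n))  # how often each pair appeared in the same subsample

        for _ in range(n_partitions):
            idx = rng.choice(n, size=int(subsample * n), replace=False)
            k = int(rng.integers(k_range[0], k_range[1] + 1))
            labels = KMeans(n_clusters=k, n_init=5).fit_predict(X[idx])
            same = (labels[:, None] == labels[None, :]).astype(float)
            coassoc[np.ix_(idx, idx)] += same   # uniform weight (WEACS would vary this)
            counts[np.ix_(idx, idx)] += 1.0

        # Normalize votes into co-association frequencies, then cluster the
        # (1 - co-association) dissimilarities with Ward linkage.
        coassoc = np.divide(coassoc, counts, out=np.zeros_like(coassoc), where=counts > 0)
        dist = 1.0 - coassoc
        np.fill_diagonal(dist, 0.0)
        Z = linkage(squareform(dist, checks=False), method="ward")
        return fcluster(Z, t=n_final_clusters, criterion="maxclust")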